Optimal Training Parameters and Hidden Layer Neuron Number of Two-Layer Perceptron for Generalised Scaled Object Classification Problem


Similar Articles

Network Packet Classification using Neural Network based on Training Function and Hidden Layer Neuron Number Variation

Distributed denial of service (DDoS) is a structured network attack coming from various sources and fused to form a large packet stream. A DDoS packet stream behaves like a normal packet stream, making it very difficult to distinguish DDoS from normal traffic. Network packet classification is one network defence mechanism for avoiding DDoS attacks. Artificial Neural Net...


Limitations of One-Hidden-Layer Perceptron Networks

Limitations of one-hidden-layer perceptron networks to represent finite mappings efficiently are investigated. It is shown that almost any uniformly randomly chosen mapping on a sufficiently large finite domain cannot be tractably represented by a one-hidden-layer perceptron network. This existential probabilistic result is complemented by a concrete example of a class of functions constructed u...


Multiple Layer Perceptron training using genetic algorithms

Multiple Layer Perceptron networks trained with the backpropagation algorithm are very frequently used to solve a wide variety of real-world problems. Usually a gradient descent algorithm is used to adapt the weights based on a comparison between the desired and actual network response to a given input stimulus. All training pairs, each consisting of an input vector and a desired output vector, are form...

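The training scheme that abstract describes can be sketched as follows: weights adapted by gradient descent from a comparison of the desired and actual network response over a set of training pairs. The sigmoid activations, squared-error loss, XOR training pairs, hidden-layer size and learning rate below are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

# Minimal gradient-descent sketch for a two-layer (one-hidden-layer)
# perceptron. All architectural choices here are assumptions for the demo.

rng = np.random.default_rng(0)

# Training pairs: input vectors and desired output vectors (XOR, assumed).
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
Y = np.array([[0.], [1.], [1.], [0.]])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(W1, b1, W2, b2):
    h = sigmoid(X @ W1 + b1)        # hidden-layer activations
    return h, sigmoid(h @ W2 + b2)  # actual network response

W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)   # 4 hidden neurons (assumed)
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)

_, y0 = forward(W1, b1, W2, b2)
initial_loss = np.mean((y0 - Y) ** 2)

lr = 1.0
for _ in range(5000):
    h, y = forward(W1, b1, W2, b2)
    err = y - Y                       # desired vs. actual response
    d2 = err * y * (1 - y)            # output-layer delta
    d1 = (d2 @ W2.T) * h * (1 - h)    # delta backpropagated to hidden layer
    W2 -= lr * (h.T @ d2); b2 -= lr * d2.sum(axis=0)
    W1 -= lr * (X.T @ d1); b1 -= lr * d1.sum(axis=0)

_, pred = forward(W1, b1, W2, b2)
final_loss = np.mean((pred - Y) ** 2)
print(f"loss: {initial_loss:.3f} -> {final_loss:.3f}")
```

The delta terms are the derivatives of the squared error through the sigmoid; genetic-algorithm training, as studied in the cited paper, would instead search the weight space without these gradients.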

Fourier Analysis and Filtering of a Single Hidden Layer Perceptron

We show that the Fourier transform of the linear output of a single hidden layer perceptron consists of a multitude of line masses passing through the origin. Each line corresponds to one of the hidden neurons and its slope is determined by that neuron's weight vector. We also show that convolving the output of the network with a function can be achieved simply by modifying the shape of the s...


Bounds on Sparsity of One-Hidden-Layer Perceptron Networks

Limitations of one-hidden-layer (shallow) perceptron networks to sparsely represent multivariable functions are investigated. A concrete class of functions is described whose computation by shallow perceptron networks requires either a large number of units or is unstable due to large output weights. The class is constructed using pseudo-noise sequences which have many features of random sequences...



Journal

Journal title: Information Technology and Management Science

Year: 2015

ISSN: 2255-9094

DOI: 10.1515/itms-2015-0007